Scammers are always trying to stay one step ahead of the rest of us when it comes to using technology for their schemes. It has been widely reported that scams are evolving through the use of artificial intelligence (AI) voice cloning. So, how does it work? Voice cloning is essentially when someone takes a recording of a person's voice and uses AI to generate new speech in that voice, changing or manipulating the message.
In February 2024, the Federal Communications Commission (FCC) declared that robocalls using AI-generated voices are illegal, a move aimed at preventing nefarious voice cloning. The ruling enables fines against the companies responsible and makes it easier to block these calls, but it is still important to be aware of AI-generated scams.
The scam that sparked major concern around the world imitated President Biden just before the New Hampshire primary election. The calls were traced to a Texas company called Life Corporation, which had sent out up to 25,000 robocalls carrying a voice-cloned message of Biden telling people not to vote in the primary. If you heard the message, it does sound similar to Biden, as it even uses some of his catchphrases. The content of the call is alarming, since a politician telling citizens not to vote is highly questionable, yet it sounds scarily legitimate.
Another dangerous AI scam was reported by a couple in Brooklyn, NY in March 2024. The couple received a call in the middle of the night from what sounded like a family member's voice, followed by a man claiming to be holding her for ransom. The family member was in a panic, and then the man came on the phone demanding that they send money to a Venmo account. The ransom demand was small, only $500, so the couple sent it in exchange for their family member's safety. The caller then pressed for more money transfers. After about 25 minutes of going back and forth on the phone, the caller ended the call by telling them to call the family member to prove she was safe. When they did, the family member had no idea what they were talking about; it was all a scam. The caller had used AI to mimic the family member's voice at the beginning of the call to trick the couple.
According to TNS survey data, 73% of US adults are concerned about AI-generated deepfake robocalls that mimic the voice of a loved one to try to scam them out of money, and 51% have received, or know someone who has received, an AI-generated deepfake robocall.
The FCC ruling banning robocalls that use AI should help prevent some of these situations, but bad actors may simply ignore it. If you receive a suspicious call from someone you know, contact them directly to verify whether the call is real. If you believe you have received or been the victim of an AI robocall scam, you can report it to the FCC.
In 2023, TNS announced the launch of AI Labs. The initiative will help TNS' carrier and enterprise customers tap into the full potential of advanced artificial intelligence technologies to help restore trust in communications, better analyze call traffic activity through voice biometrics and protect consumers from the growing use of nefarious AI voice-cloning robocalls by bad actors.
It is best practice to never engage with unknown numbers and to report phone numbers being used by scammers to your carrier. If you believe you are the victim of a scam, you can report it to your local police, your state attorney general's office and the Federal Trade Commission (FTC).
Call-blocking apps, including those powered by TNS Call Guardian®, are also a great resource for reporting and blocking unwanted robocalls. Stay vigilant, share information about scams with others and be sure to check out our monthly Scam of the Month page updates.
John Haraburda is Product Lead for TNS Call Guardian®, with specific responsibility for TNS' Communications Market solutions.
Call Guardian is a registered trademark of Transaction Network Services, Inc.